-
Lower-limb exoskeletons have the potential to transform the way we move [1-14], but current state-of-the-art controllers cannot accommodate the rich set of possible human behaviours, which range from cyclic and predictable to transitory and unstructured. We introduce a task-agnostic controller that assists the user on the basis of instantaneous estimates of lower-limb biological joint moments from a deep neural network. By estimating both hip and knee moments in the loop, our approach provided multi-joint, coordinated assistance through our autonomous, clothing-integrated exoskeleton. When deployed during 28 activities, spanning cyclic locomotion to unstructured tasks (for example, passive meandering and high-speed lateral cutting), the network accurately estimated hip and knee moments with an average R² of 0.83 relative to ground truth. Further, our approach significantly outperformed a best-case task-classifier-based method constructed from splines and impedance parameters. When tested on ten activities (including level walking, running, lifting a 25 lb (roughly 11 kg) weight and lunging), our controller significantly reduced user energetics (metabolic cost or lower-limb biological joint work, depending on the task) relative to the zero-torque condition, with reductions ranging from 5.3% to 19.7%, without any manual controller modifications among activities. Thus, this task-agnostic controller can enable exoskeletons to aid users across a broad spectrum of human activities, a necessity for real-world viability.
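A minimal sketch of the control idea described in this abstract, assuming a hypothetical PyTorch moment estimator and a simple proportional assistance scaling; the network architecture, sensor-window shape, and 20% assistance ratio are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' implementation): a task-agnostic
# assistance loop that maps a window of wearable-sensor data to hip and knee
# biological joint moment estimates, then commands a fraction of those
# moments as exoskeleton torque.
import torch
import torch.nn as nn

class MomentEstimator(nn.Module):
    """Hypothetical network: sensor window -> [hip, knee] moment (N m / kg)."""
    def __init__(self, n_channels: int = 16, window: int = 30):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_channels * window, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 2),  # hip and knee moment estimates
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def control_step(model, sensor_window, body_mass_kg, assist_ratio=0.2):
    """One control-loop iteration: estimate moments, scale to assistance torque."""
    with torch.no_grad():
        moments = model(sensor_window.unsqueeze(0)).squeeze(0)  # N m / kg
    hip_torque, knee_torque = (assist_ratio * body_mass_kg * moments).tolist()
    return hip_torque, knee_torque  # commanded exoskeleton torques (N m)

model = MomentEstimator()
dummy_window = torch.zeros(16, 30)  # placeholder for real sensor data
print(control_step(model, dummy_window, body_mass_kg=70.0))
```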
-
A human lower-limb biomechanics and wearable sensors dataset during cyclic and non-cyclic activities
Abstract: Tasks of daily living are often sporadic, highly variable, and asymmetric. Analyzing these real-world non-cyclic activities is integral to expanding the applicability of exoskeletons, prostheses, wearable sensing, and activity classification to real life, and could provide new insights into human biomechanics. Yet currently available biomechanics datasets focus either on highly consistent, continuous, and symmetric activities, such as walking and running, or on a single specific non-cyclic task. To capture a more holistic picture of lower-limb movements in everyday life, we collected data from 12 participants performing 20 non-cyclic activities (e.g. sit-to-stand, jumping, squatting, lunging, cutting) as well as 11 cyclic activities (e.g. walking, running) while kinematics (motion capture and IMUs), kinetics (force plates), and electromyography (EMG) were recorded. This dataset provides normative biomechanics for a highly diverse range of activities and common tasks from a consistent set of participants and sensors.
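A hypothetical sketch of how one multimodal trial from such a dataset might be organized and queried; the field names, array shapes, and units below are assumptions for illustration only, not the dataset's actual schema.

```python
# Hypothetical trial record combining the modalities listed in the abstract
# (motion capture/IMU kinematics, force-plate kinetics, EMG).
from dataclasses import dataclass
import numpy as np

@dataclass
class Trial:
    participant: str          # e.g. "P01"
    activity: str             # e.g. "sit-to-stand", "walking"
    cyclic: bool              # True for walking/running-style tasks
    joint_angles: np.ndarray  # (n_samples, n_joints), degrees
    grf: np.ndarray           # (n_samples, 3) ground reaction force, N
    emg: np.ndarray           # (n_samples, n_muscles), normalized

def peak_vertical_grf(trial: Trial, body_weight_n: float) -> float:
    """Peak vertical ground reaction force, expressed in body weights."""
    return float(trial.grf[:, 2].max() / body_weight_n)

# Example with placeholder data
trial = Trial("P01", "jumping", False,
              joint_angles=np.zeros((200, 6)),
              grf=np.random.rand(200, 3) * 1500.0,
              emg=np.zeros((200, 8)))
print(f"{trial.activity}: {peak_vertical_grf(trial, 700.0):.2f} BW")
```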
-
Objective: Real-time measurement of biological joint moments could enhance clinical assessments and generalize exoskeleton control. Accessing joint moments outside clinical and laboratory settings requires harnessing non-invasive wearable sensor data for indirect estimation. Previous approaches have been validated primarily during cyclic tasks, such as walking, but these methods are likely limited when translating to non-cyclic tasks where the mapping from kinematics to moments is not unique. Methods: We trained deep learning models to estimate hip and knee joint moments from kinematic sensors, electromyography (EMG), and simulated pressure insoles using a dataset that included 10 cyclic and 18 non-cyclic activities. We assessed estimation error on combinations of sensor modalities during both activity types. Results: Compared to the kinematics-only baseline, adding EMG reduced RMSE by 16.9% at the hip and 30.4% at the knee (p<0.05), and adding insoles reduced RMSE by 21.7% at the hip and 33.9% at the knee (p<0.05). Adding both modalities reduced RMSE by 32.5% at the hip and 41.2% at the knee (p<0.05), a significantly greater reduction than with either modality alone (p<0.05). All sensor additions improved model performance on non-cyclic tasks more than on cyclic tasks (p<0.05). Conclusion: These results demonstrate that adding kinetic sensor information through EMG or insoles improves joint moment estimation, both individually and in combination. These additional modalities are most important during non-cyclic tasks, which reflect the variable and sporadic nature of the real world. Significance: Improved joint moment estimation and task generalization are pivotal to developing wearable robotic systems capable of enhancing mobility in everyday life.
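A short sketch of the comparison reported in the Results: percent RMSE reduction of a sensor-fusion model relative to a kinematics-only baseline. The signals and noise levels below are placeholders to make the example runnable, not values from the paper.

```python
# Percent RMSE reduction relative to a kinematics-only baseline, as used in
# the abstract's Results. All numeric values here are placeholders.
import numpy as np

def rmse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def percent_reduction(baseline_rmse: float, model_rmse: float) -> float:
    """Reduction in RMSE relative to the baseline, in percent."""
    return 100.0 * (baseline_rmse - model_rmse) / baseline_rmse

# Placeholder example: hypothetical hip-moment estimates (N m / kg)
truth = np.sin(np.linspace(0, 2 * np.pi, 200))
kinematics_only = truth + np.random.normal(0, 0.20, truth.shape)
kinematics_emg_insoles = truth + np.random.normal(0, 0.13, truth.shape)

baseline = rmse(truth, kinematics_only)
fused = rmse(truth, kinematics_emg_insoles)
print(f"RMSE reduction vs. kinematics-only: {percent_reduction(baseline, fused):.1f}%")
```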
-
Mobile manipulation tasks such as opening a door, pulling open a drawer, or lifting a toilet lid require constrained motion of the end-effector under environmental and task constraints. This, coupled with only partial information in novel environments, makes it challenging to employ classical motion-planning approaches at test time. Our key insight is to cast this as a learning problem: leverage past experience of solving similar planning problems to directly predict motion plans for mobile manipulation tasks in novel situations at test time. To enable this, we develop a simulator, ArtObjSim, that simulates articulated objects placed in real scenes. We then introduce SeqIK+θ0, a fast and flexible representation for motion plans. Finally, we learn models that use SeqIK+θ0 to quickly predict motion plans for articulating novel objects at test time. Experimental evaluation shows improved speed and accuracy at generating motion plans compared with pure search-based methods and pure learning methods.
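A hypothetical sketch of a motion-plan representation in the spirit of SeqIK+θ0 as described above: an initial joint configuration plus a sequence of end-effector waypoints to be resolved by inverse kinematics. The abstract does not specify these details, so the field names and the IK stub are assumptions, not the paper's definition.

```python
# Hypothetical motion-plan representation: initial configuration theta0 plus
# a sequence of end-effector waypoints resolved by (stubbed) IK.
from dataclasses import dataclass, field
from typing import List
import numpy as np

@dataclass
class MotionPlan:
    theta0: np.ndarray                                             # initial joint configuration
    ee_waypoints: List[np.ndarray] = field(default_factory=list)   # SE(3) poses as 4x4 matrices

def solve_ik(pose: np.ndarray, seed: np.ndarray) -> np.ndarray:
    """Placeholder IK solver: a real system would call the robot's IK here."""
    return seed  # stub: return the seed configuration unchanged

def plan_to_joint_trajectory(plan: MotionPlan) -> List[np.ndarray]:
    """Resolve the waypoint sequence into joint configurations, seeding each
    IK call with the previous solution for continuity."""
    traj, seed = [], plan.theta0
    for pose in plan.ee_waypoints:
        seed = solve_ik(pose, seed)
        traj.append(seed)
    return traj

plan = MotionPlan(theta0=np.zeros(7), ee_waypoints=[np.eye(4)] * 5)
print(len(plan_to_joint_trajectory(plan)))  # 5 joint configurations
```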